
    Event-based Asynchronous Sparse Convolutional Networks

    Event cameras are bio-inspired sensors that respond to per-pixel brightness changes in the form of asynchronous, sparse "events". Recently, pattern recognition algorithms, such as learning-based methods, have made significant progress with event cameras by converting events into synchronous, dense, image-like representations and applying traditional machine learning methods developed for standard cameras. However, these approaches discard the spatial and temporal sparsity inherent in event data, at the cost of higher computational complexity and latency. In this work, we present a general framework for converting models trained on synchronous image-like event representations into asynchronous models with identical output, thus directly leveraging the intrinsically asynchronous and sparse nature of event data. We show both theoretically and experimentally that this drastically reduces the computational complexity and latency of high-capacity, synchronous neural networks without sacrificing accuracy. In addition, our framework has several desirable characteristics: (i) it explicitly exploits the spatio-temporal sparsity of events, (ii) it is agnostic to the event representation, network architecture, and task, and (iii) it requires no train-time changes, since it is compatible with the standard training process for neural networks. We thoroughly validate the proposed framework on two computer vision tasks: object detection and object recognition. On these tasks, we reduce computational complexity by up to 20 times relative to high-latency neural networks while outperforming state-of-the-art asynchronous approaches by up to 24% in prediction accuracy.
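    The central mechanism is easy to make concrete: convolution is linear, so a single event perturbs only the output sites whose receptive fields cover the event's pixel, and each of those sites can be updated in closed form rather than recomputed densely. Below is a minimal NumPy sketch of this incremental update for one layer (an illustration of the idea under simplifying assumptions, not the paper's implementation; the function names and single-channel setup are ours). The final assertion confirms that the asynchronous update reproduces the dense result exactly.

        import numpy as np

        def dense_conv2d(x, w):
            """Reference dense 'valid' convolution (cross-correlation)."""
            k = w.shape[0]
            out = np.zeros((x.shape[0] - k + 1, x.shape[1] - k + 1))
            for i in range(out.shape[0]):
                for j in range(out.shape[1]):
                    out[i, j] = np.sum(x[i:i + k, j:j + k] * w)
            return out

        def async_update(x, out, w, row, col, delta):
            """Apply one event at pixel (row, col): update the input in place,
            then adjust only the O(k^2) output sites that see that pixel."""
            k = w.shape[0]
            x[row, col] += delta
            for i in range(max(0, row - k + 1), min(out.shape[0], row + 1)):
                for j in range(max(0, col - k + 1), min(out.shape[1], col + 1)):
                    out[i, j] += delta * w[row - i, col - j]
            return out

        # One event costs ~k^2 operations instead of a full H*W*k^2 pass,
        # yet yields an output identical to dense recomputation.
        rng = np.random.default_rng(0)
        x = np.zeros((32, 32))
        w = rng.standard_normal((3, 3))
        out = dense_conv2d(x, w)
        out = async_update(x, out, w, row=10, col=12, delta=1.0)
        assert np.allclose(out, dense_conv2d(x, w))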

    Memristor devices for neural networks

    Neural network technologies have taken center stage owing to their powerful computing capability for supporting deep learning in artificial intelligence. However, conventional memory devices such as SRAM and DRAM are not satisfactory solutions for the synaptic elements of neural networks. Recently, several types of memristor devices have become popular alternatives because of their outstanding characteristics, such as scalability, high performance, and non-volatility. To understand the characteristics of memristors, we compare the proposed devices in terms of both maturity and performance. Among them, magnetoresistive random-access memory, phase-change random-access memory, and resistive random-access memory are good candidates as synaptic devices for the weight storage and matrix-vector multiplication required in artificial neural networks (ANNs). Moreover, these devices play key roles as synaptic devices in research on bio-plausible spiking neural networks (SNNs), because their distinctive switching properties are well matched to emulating the synaptic and neuron functions of biological neural networks. In this paper, we review the motivation, advantages, technologies, and applications of memristor devices for neural networks, from practical ANN approaches to futuristic SNN research, in light of the current status of memristor technology.
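    The role of memristors in matrix-vector multiplication follows from two circuit laws: Ohm's law makes each cell's current the product of its row voltage and its programmed conductance, and Kirchhoff's current law sums those currents along each column, so reading out a crossbar computes an analog matrix-vector product in one step. The NumPy sketch below illustrates this idealized model together with the common differential-pair encoding of signed weights (the conductance range and function names are illustrative assumptions, not a specific device's parameters).

        import numpy as np

        G_MIN, G_MAX = 1e-6, 1e-4  # assumed device conductance range, in siemens

        def weights_to_conductances(w):
            """Encode signed weights as a differential pair (G+, G-) of
            non-negative conductances, since memristor conductance is >= 0."""
            scale = (G_MAX - G_MIN) / np.abs(w).max()
            g_pos = G_MIN + scale * np.clip(w, 0.0, None)
            g_neg = G_MIN + scale * np.clip(-w, 0.0, None)
            return g_pos, g_neg, scale

        def crossbar_mvm(v, g):
            """Ideal crossbar read-out: Ohm's law (I = V*G per cell) plus
            Kirchhoff's current law (columns sum) give I_j = sum_i V_i * G_ij."""
            return v @ g

        rng = np.random.default_rng(1)
        w = rng.standard_normal((4, 3))  # 4 input rows -> 3 output columns
        v = rng.standard_normal(4)       # voltages applied to the rows

        g_pos, g_neg, scale = weights_to_conductances(w)
        i_out = crossbar_mvm(v, g_pos) - crossbar_mvm(v, g_neg)  # differential read
        assert np.allclose(i_out / scale, v @ w)  # one analog step computes v @ w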